Weighted Coordinate-Wise Pegasos

Authors

  • Vilen Jumutc
  • Johan A. K. Suykens
Abstract

Pegasos is a popular and reliable machine learning algorithm that makes linear Support Vector Machines solvable at larger scales. It benefits from a strongly convex optimization objective, faster convergence rates, and lower computational and memory costs. In this paper we devise a new weighted formulation of the Pegasos algorithm that benefits from different coordinate-wise λi regularization parameters. Together with the proposed extension we give a brief theoretical justification of its convergence to an optimal solution and briefly analyze its computational costs. We conclude with numerical results obtained on UCI datasets and demonstrate the merits and importance of our approach for achieving better classification accuracy and convergence rates in the partially or fully stochastic setting.
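To make the idea concrete, the stochastic sub-gradient update of Pegasos with a per-coordinate regularization vector can be sketched as follows. This is a hypothetical illustration based only on the abstract: the function name `weighted_pegasos`, the step-size schedule `1/(max(λ) t)`, and the omission of the optional projection step are assumptions, not the paper's exact algorithm.

```python
import numpy as np

def weighted_pegasos(X, y, lam, n_iters=2000, seed=0):
    """Stochastic sub-gradient solver in the spirit of Pegasos, with a
    per-coordinate regularization vector lam (hypothetical sketch; the
    paper's exact step-size schedule and projection step may differ).
    X: (n, d) features, y: (n,) labels in {-1, +1}, lam: (d,) > 0."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    lam_max = np.max(lam)
    for t in range(1, n_iters + 1):
        i = rng.integers(n)                 # draw one random sample
        eta = 1.0 / (lam_max * t)           # decaying step size (assumed)
        violated = y[i] * (X[i] @ w) < 1.0  # hinge-loss margin check
        w *= 1.0 - eta * lam                # coordinate-wise shrinkage via lam_j
        if violated:
            w += eta * y[i] * X[i]          # sub-gradient step on the hinge loss
    return w

# Toy usage: a linearly separable 2-D problem with different per-coordinate
# regularization on the two features.
X = np.array([[2.0, 0.0], [1.5, 0.5], [-2.0, 0.0], [-1.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = weighted_pegasos(X, y, lam=np.array([0.1, 0.5]))
```

Note how the scalar shrinkage factor (1 - ηλ) of standard Pegasos becomes the vector factor (1 - ηλ_j), so coordinates with larger λ_j decay faster toward zero.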


Related Papers

Coordinate Descent Method for Large-scale L2-loss Linear SVM

Linear support vector machines (SVM) are useful for classifying large-scale sparse data. Problems with sparse features are common in applications such as document classification and natural language processing. In this paper, we propose a novel coordinate descent algorithm for training linear SVM with the L2-loss function. At each step, the proposed method minimizes a one-variable sub-problem wh...


Coordinate Descent Method for Large-scale L2-loss Linear Support Vector Machines

Linear support vector machines (SVM) are useful for classifying large-scale sparse data. Problems with sparse features are common in applications such as document classification and natural language processing. In this paper, we propose a novel coordinate descent algorithm for training linear SVM with the L2-loss function. At each step, the proposed method minimizes a one-variable sub-problem w...


Conflict Graphs for Parallel Stochastic Gradient Descent

We present various methods for inducing a conflict graph in order to effectively parallelize Pegasos. Pegasos is a stochastic sub-gradient descent algorithm for solving the Support Vector Machine (SVM) optimization problem [3]. In particular, we introduce a binary tree-based conflict graph that matches the convergence of a well-known parallel implementation of stochastic gradient descent, known as HOG...


Multi-Class Pegasos on a Budget

When equipped with kernel functions, online learning algorithms are susceptible to the “curse of kernelization” that causes unbounded growth in the model size. To address this issue, we present a family of budgeted online learning algorithms for multi-class classification which have constant space and time complexity per update. Our approach is based on the multi-class version of the popular Pe...


Stochastic Gradient Twin Support Vector Machine for Large Scale Problems

The stochastic gradient descent algorithm has been successfully applied to support vector machines (called PEGASOS) for many classification problems. In this paper, the stochastic gradient descent algorithm is applied to twin support vector machines for classification. Compared with PEGASOS, the proposed stochastic gradient twin support vector machine (SGTSVM) is insensitive to stochastic samplin...




Publication date: 2013